DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs
In this paper, we study the problem of learning probabilistic logical rules for inductive and interpretable link prediction. Despite the importance of inductive link prediction, most previous works focused on transductive link prediction and cannot handle previously unseen entities. Moreover, they are black-box models that are not easily explainable to humans. We propose DRUM, a scalable and differentiable approach for mining first-order logical rules from knowledge graphs that resolves these problems. We motivate our method by making a connection between learning confidence scores for each rule and low-rank tensor approximation. DRUM uses bidirectional RNNs to share useful information across the tasks of learning rules for different relations. We also empirically demonstrate the efficiency of DRUM over existing rule mining methods for inductive link prediction on a variety of benchmark datasets.
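The connection to low-rank tensor approximation can be illustrated with a toy sketch. This is not the paper's implementation; the sizes (10 relations, rules of length 3, rank 2) are made up, and the rank-L tensor is built here as a plain sum of outer products:

```python
import numpy as np

n_relations, rule_len, rank = 10, 3, 2

# Naively, every rule (i.e. every sequence of rule_len relations) needs its
# own confidence score: |R|^T parameters.
full_params = n_relations ** rule_len            # 10^3 = 1000

# A rank-L factorization instead learns L coefficient vectors per step:
# L * T * |R| parameters, linear rather than exponential in rule length.
low_rank_params = rank * rule_len * n_relations  # 2 * 3 * 10 = 60

# The full confidence tensor is then approximated by a sum of L outer
# products of the per-step coefficient vectors.
a = np.random.rand(rank, rule_len, n_relations)
approx = sum(np.einsum('i,j,k->ijk', a[l, 0], a[l, 1], a[l, 2])
             for l in range(rank))
assert approx.shape == (n_relations,) * rule_len
```

The point of the factorization is the parameter count: the full tensor grows exponentially with rule length, while the low-rank form grows linearly.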
Reviews: DRUM: End-To-End Differentiable Rule Mining On Knowledge Graphs
The authors propose DRUM, an end-to-end differentiable rule-based inference method that can be used for mining rules via backprop and for extracting rules from data. Their approach is quite interesting: it can be trained from positive examples only, without negative sampling (currently a burden for representation learning algorithms targeting knowledge graphs). In DRUM, paths in a knowledge graph are represented by a chain of matrix multiplications (this idea is not especially novel - see [1]). For mining rules, the authors start from a formulation of the problem where each rule is associated with a confidence weight, and try to maximise the likelihood of training triples by optimising an end-to-end differentiable objective. However, the space of possible rules (and thus the number of parameters as confidence scores) is massive, so the authors propose a way of efficiently approximating the rule-scores tensor with another tensor of lower rank (Eq.
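The "chain of matrix multiplications" idea the review mentions can be sketched on a toy knowledge graph. This is an illustrative example, not code from the paper: entities are indexed 0..3, each relation is a boolean adjacency matrix, and the hypothetical relations `parent` and `sibling` are made up for the sketch:

```python
import numpy as np

# Toy KG with 4 entities; each relation is an entity-by-entity adjacency matrix.
parent = np.zeros((4, 4))
parent[0, 1] = parent[1, 2] = 1   # parent(0, 1), parent(1, 2)

sibling = np.zeros((4, 4))
sibling[2, 3] = 1                 # sibling(2, 3)

# Composing relations along a path is matrix multiplication:
# path[x, w] counts the y, z with parent(x, y), parent(y, z), sibling(z, w).
path = parent @ parent @ sibling
print(path[0, 3])  # → 1.0
```

Because matrix multiplication is differentiable, a soft (weighted) mixture of relation matrices at each step lets gradient descent learn which relation sequences (rules) explain the training triples.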
- Information Technology > Artificial Intelligence > Representation & Reasoning > Semantic Networks (0.85)
- Information Technology > Artificial Intelligence > Representation & Reasoning > Rule-Based Reasoning (0.80)
- Information Technology > Artificial Intelligence > Machine Learning > Neural Networks (0.58)
This paper proposes an interesting approach for differentiable, interpretable rule mining given a knowledge base. The major pro of the approach is its applicability in an inductive setting without the need for negative examples, which excited the reviewers. Initially the paper lacked important comparisons to many related works, but the authors did a good job in the rebuttal. Please include the comparison results in the final version, along with the results on the other datasets pointed out by the reviewers. I would like to recommend an acceptance to NeurIPS.
Sadeghian, Ali, Armandpour, Mohammadreza, Ding, Patrick, Wang, Daisy Zhe